NTISthis.com

Evidence Guide: ICTTEN7226A - Manage development and application of testing artefacts

Student: __________________________________________________

Signature: _________________________________________________

Tips for gathering evidence to demonstrate your skills

The important thing to remember when gathering evidence is that more evidence is better: the more evidence you gather to demonstrate your skills, the more confident an assessor can be that you have not only learned the skills at one point in time, but are continuing to apply and develop them (as opposed to just learning for the test!). Furthermore, a single piece of evidence will not usually demonstrate all the required criteria for a unit of competency, whereas multiple overlapping pieces of evidence usually will.


ICTTEN7226A - Manage development and application of testing artefacts

What evidence can you provide to prove your understanding of each of the following criteria?

Plan the test effort and develop test strategy for software testing

  1. Analyse the functionalities of the application software from the system design document and create a test strategy for a new telecommunications product line
  2. Produce the steps in developing the test strategy and the attributes of each step according to enterprise policy
  3. Produce a test strategy to evaluate the suitability of the application software for integration into the telecommunications network
  4. Assess a range of tests required to evaluate the performance and functionality of the application software and determine the tests required to suit the test regime
  5. Evaluate features of testing tools and debuggers and select an appropriate tool to test the software application and detect faults
  6. Produce a test plan based on the requirements of the project specifications and identify testing artefacts required for model-based software testing
Analyse the functionalities of the application software from the system design document and create a test strategy for a new telecommunications product line

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Produce the steps in developing the test strategy and the attributes of each step according to enterprise policy

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Produce a test strategy to evaluate the suitability of the application software for integration into the telecommunications network

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Assess a range of tests required to evaluate the performance and functionality of the application software and determine the tests required to suit the test regime

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Evaluate features of testing tools and debuggers and select an appropriate tool to test the software application and detect faults

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Produce a test plan based on the requirements of the project specifications and identify testing artefacts required for model-based software testing

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Plan the development of testing artefacts

  1. Develop software testing strategies for uncovering evidence of defects in software systems as part of the quality assurance process
  2. Produce a test dependency model of the relationship between the test regime and the test levels in software testing for referencing and validation
  3. Analyse the software testing requirements to determine the domain testing and application testing artefact requirements to validate the software product
Develop software testing strategies for uncovering evidence of defects in software systems as part of the quality assurance process

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Produce a test dependency model of the relationship between the test regime and the test levels in software testing for referencing and validation

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Analyse the software testing requirements to determine the domain testing and application testing artefact requirements to validate the software product

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Develop and manage testing artefacts

  1. Create reusable domain artefacts to detect early defects in domain testing
  2. Modify domain test artefacts by binding the variability to create application test artefacts to detect defects in product line applications
  3. Produce test reporting associated with phases of a test cycle within the test plan
  4. Analyse test reports and evaluate the impact of the test plan and its testing artefacts on the testing environment
  5. Manage the progress of the test plan to minimise the risks associated with testing and ensure testing complies with test requirements
  6. Report detected defects to product evaluation personnel and prepare a strategy to manage the defects
  7. Analyse the testing metrics produced by the test tool to manage the tracking of defects and use metrics to plan process improvements
  8. Produce an evaluation report from the traceability matrix to manage the defects or failures from the test plan
Create reusable domain artefacts to detect early defects in domain testing

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Modify domain test artefacts by binding the variability to create application test artefacts to detect defects in product line applications

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Produce test reporting associated with phases of a test cycle within the test plan

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Analyse test reports and evaluate the impact of the test plan and its testing artefacts on the testing environment

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Manage the progress of the test plan to minimise the risks associated with testing and ensure testing complies with test requirements

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Report detected defects to product evaluation personnel and prepare a strategy to manage the defects

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Analyse the testing metrics produced by the test tool to manage the tracking of defects and use metrics to plan process improvements

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Produce an evaluation report from the traceability matrix to manage the defects or failures from the test plan

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Assessed

Teacher: ___________________________________ Date: _________

Signature: ________________________________________________

Comments:

 

 

 

 

 

 

 

 

Instructions to Assessors

Evidence Guide

The evidence guide provides advice on assessment and must be read in conjunction with the performance criteria, required skills and knowledge, range statement and the Assessment Guidelines for the Training Package.

Overview of assessment

Critical aspects for assessment and evidence required to demonstrate competency in this unit

Evidence of the ability to:

produce the steps in developing the test strategy

produce a test strategy to evaluate the suitability of the application software

develop software testing strategies for uncovering evidence of defects in software systems

create reusable domain artefacts to detect early defects in domain testing

produce test reporting associated with phases of a test cycle within the test plan

produce an evaluation report from the traceability matrix.

Context of, and specific resources for assessment

Assessment must ensure:

site where testing artefacts can be developed and applied

software tools currently used in industry

vendor products, specifications, equipment and enterprise policy required for the activity.

Methods of assessment

A range of assessment methods should be used to assess practical skills and knowledge. The following examples are appropriate for this unit:

direct observation of the candidate managing development and application of testing artefacts

review of test plans, reports and evaluation documents prepared by the candidate

oral or written questioning of the candidate to assess required knowledge.

Guidance information for assessment

Holistic assessment with other units relevant to the industry sector, workplace and job role is recommended, for example:

ICTTEN7225A Manage network testing strategies.

Aboriginal people and other people from a non-English speaking background may have second language issues.

Access must be provided to appropriate learning and assessment support when required.

Assessment processes and techniques must be culturally appropriate, and appropriate to the oral communication skill level, and language and literacy capacity of the candidate and the work being performed.

In all cases where practical assessment is used it will be combined with targeted questioning to assess required knowledge. Questioning techniques should not require language, literacy and numeracy skills beyond those required in this unit of competency.

Where applicable, physical resources should include equipment modified for people with special needs.

Required Skills and Knowledge

Required skills

analytical skills to evaluate product and technology needs

communication skills to:

interact with enterprise personnel, customers and other contractors, while maintaining a customer focus and consideration of customer needs

liaise with internal and external personnel on technical and operational matters

literacy skills to:

prepare reports given a specific format

read and interpret technical documentation, software and hardware manuals, specifications and relevant enterprise policy

numeracy skills to take and analyse measurements

planning and organisational skills to:

break large projects into a series of small projects

manage and prioritise own work

organise testing

problem solving skills to resolve software, hardware and logistics problems

safety awareness skills to follow all related occupational health and safety (OHS) requirements and work practices

task management skills to work systematically with required attention to detail and adherence to project requirements

technical skills to:

configure software testing tools

create a traceability matrix

develop software testing strategies

Required knowledge

application software features and functionalities

configuration of software testing tools

creation of testing artefacts

creation of a traceability matrix

domain and application testing

elements of test plans

features of testing tools and debuggers

identification of types of risks

management of:

software defects

test traceability

phases of test cycles

production of test reviews

steps in developing test strategy

testing procedures

Range Statement

The range statement relates to the unit of competency as a whole. It allows for different work environments and situations that may affect performance. Bold italicised wording, if used in the performance criteria, is detailed below. Essential operating conditions that may be present with training and assessment (depending on the work situation, needs of the candidate, accessibility of the item, and local industry and regional contexts) may also be included.

Application software may be for:

firmware

games software

media centre

mobile phone

operating system

set-top box

server

web application

wireless modem.

Test strategy may include:

communicating test plans to stakeholders and obtaining buy-in from business clients

coordinating test environment and data requirements before starting each phase

defining objectives, timelines and approach for the test effort

defining test activities with roles and responsibilities.

Steps in developing the test strategy may include:

change management:

models for assessing impact of changes on testing

plan for managing requirement changes

process for keeping test artefacts in sync with development artefacts

configuration management:

list of testing artefacts

tools and techniques for configuration management

communication and status reporting

testing approach:

methodology for test development and execution

planning for test execution cycles

specification of test environment set-up

testing process life cycle

testing templates, checklists and guidelines

test automation:

criteria for feasibility of test automation

defect management

test automation strategy

test tool identification

test environment specifications:

configuration management, maintenance of test bed and build management

hardware and software requirements

test data creation

testing metrics:

metrics to match strategic objectives

plan of process improvement based on metrics

techniques for collecting metrics

tools to gather and analyse metrics

types of testing:

different phases of testing required

different types of testing

test coverage

objective and scope of testing:

business objectives

extent of application to be tested

goals to be met by testing effort

what systems and components need to be tested.
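
The steps listed above are usually brought together in a single structured strategy document. Below is a minimal sketch, assuming a Python dictionary is an acceptable machine-readable outline of such a document; the section names mirror the list above and every value is an illustrative placeholder, not prescribed enterprise content.

    # Minimal test strategy outline sketch; all values below are illustrative
    # placeholders rather than prescribed enterprise content.
    test_strategy = {
        "objective_and_scope": {
            "business_objectives": ["validate product line release 1.0"],
            "systems_under_test": ["provisioning portal", "billing interface"],
        },
        "testing_approach": {
            "methodology": "risk-based, model-based test design",
            "execution_cycles": 3,
            "environment_setup": "staging replica of the telecommunications network",
        },
        "types_of_testing": ["unit", "integration", "system", "regression"],
        "test_automation": {
            "feasibility_criteria": ["repeatable", "stable interfaces"],
            "tool": "to be selected per enterprise policy",
        },
        "test_environment": {
            "hardware": ["test bed switch", "application server"],
            "software": ["OS image", "test data generator"],
        },
        "change_management": "keep test artefacts in sync with development artefacts",
        "configuration_management": "version-control all testing artefacts",
        "testing_metrics": ["defect density", "test coverage"],
        "communication": "weekly status report to stakeholders",
    }

Each key corresponds to one of the steps above; an enterprise template would normally fix which sections are mandatory.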

Range of tests may include:

compatibility

on demand testing

performance

process improvement

technical

test management

verification.

Features of testing tools and debuggers may include:

automated functional graphical user interface (GUI) testing tool

benchmarks

formatted dump or symbolic debugging

performance analysis or profiling tool

program monitors:

code coverage reports

instruction set simulator

program animation.
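
Program monitors and code coverage reports of the kind listed above can be produced with quite simple tooling. The sketch below uses Python's standard-library trace module purely as an illustration (an assumed choice of tooling, not one named by the unit); classify_signal is a hypothetical function standing in for the application code under test.

    import trace

    def classify_signal(level):
        # Hypothetical function standing in for the application code under test.
        if level > 10:
            return "strong"
        return "weak"

    # Count how many times each line executes while the tests run; this acts as a
    # simple program monitor and produces a line coverage report.
    tracer = trace.Trace(count=True, trace=False)
    tracer.runfunc(classify_signal, 15)
    tracer.runfunc(classify_signal, 3)

    results = tracer.results()
    results.write_results(show_missing=True, coverdir=".")  # writes *.cover report files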

Test plan may include:

acceptance or commissioning test

design verification or compliance test

manufacturing or production test

regression test.

Testing artefacts produced by software testing may include:

domain or application testing

test case

test data

test harness

test plan

test scripts

test suite

traceability matrix.
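
Of the artefacts listed above, the traceability matrix is the one the unit returns to for the evaluation report: it maps each requirement to the test cases that cover it so that defects and failures can be traced back. Below is a minimal sketch, assuming string identifiers are sufficient; all requirement IDs, test case IDs and results are hypothetical.

    # Minimal traceability matrix sketch: requirement -> covering test cases,
    # plus the latest result per test case. All identifiers are hypothetical.
    traceability = {
        "REQ-001": ["TC-101", "TC-102"],
        "REQ-002": ["TC-103"],
        "REQ-003": [],  # gap: no covering test case yet
    }
    results = {"TC-101": "pass", "TC-102": "fail", "TC-103": "pass"}

    for requirement, cases in traceability.items():
        if not cases:
            print(f"{requirement}: NOT COVERED")
        elif any(results.get(tc) == "fail" for tc in cases):
            failed = [tc for tc in cases if results.get(tc) == "fail"]
            print(f"{requirement}: defect open in {', '.join(failed)}")
        else:
            print(f"{requirement}: covered and passing")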

Software testing strategies may refer to:

defects when the software system does not behave as specified

test level requirements

variability of product line.

Defects may be:

deferred

fixed

rejected

treated.
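
Defect dispositions such as those above feed the testing metrics used to track defects and plan process improvements. A minimal tally sketch follows; the defect records are hypothetical and the choice of metrics is only an example.

    from collections import Counter

    # Hypothetical defect records using the dispositions listed above.
    defects = [
        {"id": "DEF-001", "status": "fixed"},
        {"id": "DEF-002", "status": "deferred"},
        {"id": "DEF-003", "status": "rejected"},
        {"id": "DEF-004", "status": "fixed"},
    ]

    by_status = Counter(d["status"] for d in defects)
    closure_rate = by_status["fixed"] / len(defects)
    print(dict(by_status))                      # e.g. {'fixed': 2, 'deferred': 1, 'rejected': 1}
    print(f"closure rate: {closure_rate:.0%}")  # input to process improvement planning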

Test levels refer to:

integration test of multiple elements that form a configuration specified in the architecture

system test that validates behaviour of whole system

unit test of a single element.
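
The three levels above differ only in the scope of the element under test. A minimal unit-level sketch follows, using Python's standard unittest module (an assumed choice of framework); add_subscriber is a hypothetical single element. An integration test would exercise several such elements in the configuration specified by the architecture, and a system test would validate the behaviour of the whole system.

    import unittest

    def add_subscriber(register, msisdn):
        # Hypothetical single element under test: add a subscriber number to a
        # register, rejecting duplicates.
        if msisdn in register:
            raise ValueError("duplicate subscriber")
        register.add(msisdn)
        return register

    class AddSubscriberUnitTest(unittest.TestCase):
        def test_adds_new_subscriber(self):
            self.assertIn("61400000001", add_subscriber(set(), "61400000001"))

        def test_rejects_duplicate(self):
            with self.assertRaises(ValueError):
                add_subscriber({"61400000001"}, "61400000001")

    if __name__ == "__main__":
        unittest.main()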

Domain testing features may:

aim at testing common parts and preparing test artefacts for variable parts

create reusable test artefacts for application testing

deal with variability of various domains

uncover evidence of defects in domain artefacts.

Application testing features may:

aim at reusing test artefacts for common parts and reusing predefined variable domain test artefacts to test specific applications

reuse domain test artefacts to uncover evidence of defects in product line applications.
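
Put together, the idea is that a domain test artefact is written once against the common behaviour with its variation points left open, and each application test binds that variability for a specific product. The sketch below illustrates this under assumed names; the session-limit variation point and both product variants are hypothetical, not drawn from the unit.

    # Domain test artefact: reusable check of common behaviour, with the
    # variation point (max_sessions) left unbound.
    def domain_session_limit_check(create_session, max_sessions):
        sessions = [create_session(i) for i in range(max_sessions)]
        assert len(sessions) == max_sessions  # common behaviour as specified
        try:
            create_session(max_sessions)      # one past the variant's limit
            raise AssertionError("expected the session limit to be enforced")
        except RuntimeError:
            pass                              # limit enforced: defect not present

    def make_session_factory(limit):
        # Hypothetical element from a product line application, with the
        # variability (its session limit) fixed per variant.
        live = []
        def create(session_id):
            if len(live) >= limit:
                raise RuntimeError("session limit reached")
            live.append(session_id)
            return session_id
        return create

    # Application test artefacts: bind the variability for each product variant.
    def test_set_top_box_variant():
        domain_session_limit_check(make_session_factory(2), max_sessions=2)

    def test_media_centre_variant():
        domain_session_limit_check(make_session_factory(8), max_sessions=8)

    test_set_top_box_variant()
    test_media_centre_variant()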

Phases of a test cycle may include:

defect retesting

regression testing

requirements analysis

test closure

test development

test execution

test planning

test reporting

test result analysis.
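
Test reporting is usually rolled up against these phases. A minimal per-phase report sketch follows, assuming counts of planned, executed and failed cases are enough for the status report; the figures shown are purely illustrative.

    # Minimal per-phase test report sketch; all figures are illustrative.
    phases = [
        {"phase": "test execution",     "planned": 120, "executed": 96, "failed": 7},
        {"phase": "defect retesting",   "planned": 7,   "executed": 5,  "failed": 1},
        {"phase": "regression testing", "planned": 40,  "executed": 40, "failed": 0},
    ]

    print(f"{'Phase':<20}{'Planned':>8}{'Executed':>9}{'Failed':>7}{'Pass %':>8}")
    for p in phases:
        passed = p["executed"] - p["failed"]
        pass_rate = 100.0 * passed / p["executed"] if p["executed"] else 0.0
        print(f"{p['phase']:<20}{p['planned']:>8}{p['executed']:>9}{p['failed']:>7}{pass_rate:>8.1f}")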